Stochastic Digital Backpropagation With Residual Memory Compensation


Related articles

Memory-Efficient Backpropagation Through Time

We propose a novel approach to reduce memory consumption of the backpropagation through time (BPTT) algorithm when training recurrent neural networks (RNNs). Our approach uses dynamic programming to balance a trade-off between caching of intermediate results and recomputation. The algorithm is capable of tightly fitting within almost any user-set memory budget while finding an optimal execution...
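The cache-versus-recompute trade-off described above can be sketched in a few lines. The toy below runs BPTT on a small tanh RNN while caching only every k-th hidden state and recomputing the intermediate states segment by segment during the backward pass; it uses uniform checkpoints for clarity, whereas the paper selects the schedule by dynamic programming. All names and the toy loss are illustrative, not from the paper.

```python
import numpy as np

def forward_step(W, h, x):
    """One recurrent step h_{t+1} = tanh(W h_t + x_t) of a toy RNN."""
    return np.tanh(W @ h + x)

def bptt_checkpointed(W, xs, h0, k):
    """BPTT for loss = sum(h_T), caching only every k-th hidden state
    and recomputing the rest segment by segment on the backward pass.
    Uniform checkpoints for clarity; requires len(xs) % k == 0."""
    T = len(xs)
    assert T % k == 0
    ckpt = {0: h0}
    h = h0
    for t in range(T):                      # forward: store checkpoints only
        h = forward_step(W, h, xs[t])
        if (t + 1) % k == 0:
            ckpt[t + 1] = h
    loss = h.sum()
    dW = np.zeros_like(W)
    dh = np.ones_like(h)                    # d loss / d h_T
    for seg in range(T - k, -1, -k):        # backward, last segment first
        hs = [ckpt[seg]]
        for t in range(seg, seg + k):       # recompute states in this segment
            hs.append(forward_step(W, hs[-1], xs[t]))
        for t in range(seg + k - 1, seg - 1, -1):
            h_prev, h_cur = hs[t - seg], hs[t - seg + 1]
            da = dh * (1.0 - h_cur ** 2)    # back through tanh
            dW += np.outer(da, h_prev)      # accumulate weight gradient
            dh = W.T @ da                   # propagate to previous step
    return loss, dW
```

Peak memory for hidden states drops from O(T) to O(T/k + k) at the cost of one extra forward pass over each segment.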


MuProp: Unbiased Backpropagation for Stochastic Neural Networks

Deep neural networks are powerful parametric models that can be trained efficiently using the backpropagation algorithm. Stochastic neural networks combine the power of large parametric functions with that of graphical models, which makes it possible to learn very complex distributions. However, as backpropagation is not directly applicable to stochastic networks that include discrete sampling ...
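For a discrete sampling unit, the standard workaround is the likelihood-ratio (score-function) estimator, which MuProp improves on by adding a Taylor-expansion control variate. A minimal sketch of that baseline estimator for a single Bernoulli unit follows; it is the high-variance starting point, not MuProp itself, and the function name is illustrative.

```python
import numpy as np

def score_function_grad(p, f, n_samples, rng):
    """Likelihood-ratio (REINFORCE) estimator for a single Bernoulli(p)
    unit: d/dp E[f(b)] = E[f(b) * d log P(b) / dp].  Unbiased, but its
    variance is what MuProp's control variate is designed to reduce."""
    b = (rng.random(n_samples) < p).astype(float)
    score = b / p - (1.0 - b) / (1.0 - p)   # d log P(b) / dp
    return np.mean(f(b) * score)
```

For f(b) = b we have E[f(b)] = p, so the estimator should average to 1 for any p.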


Impairment mitigation in superchannels with digital backpropagation and MLSD.

We assess numerically the performance of single-carrier digital backpropagation (SC-DBP) and maximum-likelihood sequence detection (MLSD) for DP-QPSK and DP-16QAM superchannel transmission over dispersion uncompensated links for three different cases of spectral shaping: optical pre-filtering of RZ and NRZ spectra, and digital Nyquist filtering. We investigate the limits for carrier proximity o...
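The core of single-carrier DBP is split-step integration of the nonlinear Schrödinger equation with negated coefficients, undoing dispersion and Kerr nonlinearity in backward steps. The sketch below uses normalized units, omits fiber loss, and assumes the forward link applies dispersion as exp(-j(β₂/2)ω²z); parameter names are illustrative, not taken from the paper.

```python
import numpy as np

def dbp(rx, n_steps, beta2, gamma, length, fs):
    """Single-carrier digital backpropagation sketch: split-step NLSE
    integration with negated dispersion and nonlinearity.  Normalized
    units, lossless fiber; one linear + one nonlinear substep per step."""
    n = len(rx)
    w = 2 * np.pi * np.fft.fftfreq(n, d=1.0 / fs)   # angular frequency grid
    dz = length / n_steps
    d_inv = np.exp(1j * (beta2 / 2) * w**2 * dz)    # inverse dispersion step
    a = np.asarray(rx, dtype=complex).copy()
    for _ in range(n_steps):
        a = np.fft.ifft(np.fft.fft(a) * d_inv)        # undo chromatic dispersion
        a *= np.exp(-1j * gamma * np.abs(a)**2 * dz)  # undo Kerr phase rotation
    return a
```

With gamma = 0 the linear steps commute, so DBP inverts a purely dispersive channel exactly; with nonlinearity the accuracy is set by the step size, which is the usual complexity/performance trade-off in DBP.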


Backpropagation Training Algorithm with Adaptive Parameters to Solve Digital Problems

An efficient technique, namely backpropagation training with adaptive parameters using Lyapunov stability theory, is proposed for training a single-hidden-layer feedforward network. A three-layered feedforward neural network architecture is used to solve the selected problems. The network is trained in sequential mode. Lyapunov stability theory is employed to ensure the faster and stea...
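The base training loop being adapted is plain sequential (pattern-by-pattern) backpropagation on a single-hidden-layer network. The sketch below trains such a network on XOR, a typical "digital problem", with a fixed learning rate; the paper's Lyapunov-based adaptive parameters are not reproduced here, and all hyperparameters are illustrative.

```python
import numpy as np

def train_xor(epochs=2000, lr=0.5, seed=0):
    """Sequential-mode backpropagation for a 2-4-1 network on XOR.
    Fixed learning rate (no Lyapunov adaptation); returns the total
    squared error per epoch so convergence can be inspected."""
    rng = np.random.default_rng(seed)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
    y = np.array([0.0, 1.0, 1.0, 0.0])
    W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)
    W2 = rng.normal(0, 1, 4);      b2 = 0.0
    sig = lambda a: 1.0 / (1.0 + np.exp(-a))
    losses = []
    for _ in range(epochs):
        total = 0.0
        for x, t in zip(X, y):              # sequential training mode
            h = np.tanh(x @ W1 + b1)        # hidden activations
            o = sig(h @ W2 + b2)            # network output
            total += 0.5 * (o - t) ** 2
            do = (o - t) * o * (1 - o)      # output-layer delta
            dh = do * W2 * (1 - h ** 2)     # hidden-layer delta
            W2 -= lr * do * h; b2 -= lr * do
            W1 -= lr * np.outer(x, dh); b1 -= lr * dh
        losses.append(total)
    return losses
```

With a fixed learning rate the error typically decreases but the rate must be tuned by hand; the Lyapunov-based adaptation in the paper is aimed at choosing such parameters with a stability guarantee.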


Stochastic Backpropagation through Mixture Density Distributions

The ability to backpropagate stochastic gradients through continuous latent distributions has been crucial to the emergence of variational autoencoders [4, 6, 7, 3] and stochastic gradient variational Bayes [2, 5, 1]. The key ingredient is an unbiased and low-variance way of estimating gradients with respect to distribution parameters from gradients evaluated at distribution samples. The “repar...
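The reparameterization trick for a Gaussian latent is small enough to state directly: instead of sampling z ~ N(μ, σ²) as an opaque operation, draw ε ~ N(0, 1) and compute z = μ + σε, so gradients with respect to the distribution parameters flow through a deterministic transform. A minimal sketch (function name is illustrative):

```python
import numpy as np

def reparam_sample(mu, log_sigma, rng):
    """Reparameterization trick: z = mu + exp(log_sigma) * eps with
    eps ~ N(0, 1), so d z / d mu = 1 and gradients w.r.t. the
    distribution parameters bypass the sampling operation."""
    eps = rng.standard_normal(np.shape(mu))
    return mu + np.exp(log_sigma) * eps
```

As a check of the pathwise estimator, for f(z) = z² we have d/dμ E[f(z)] = E[2z] = 2μ, and averaging 2z over samples drawn this way recovers that value.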



Journal

Journal title: Journal of Lightwave Technology

Year: 2016

ISSN: 0733-8724, 1558-2213

DOI: 10.1109/jlt.2015.2477462